Minimax and Hamiltonian Dynamics of Excitatory-Inhibitory Networks

Authors

  • H. Sebastian Seung
  • Tom J. Richardson
  • Jeffrey C. Lagarias
  • John J. Hopfield
Abstract

A Lyapunov function for excitatory-inhibitory networks is constructed. The construction assumes symmetric interactions within excitatory and inhibitory populations of neurons, and antisymmetric interactions between populations. The Lyapunov function yields sufficient conditions for the global asymptotic stability of fixed points. If these conditions are violated, limit cycles may be stable. The relations of the Lyapunov function to optimization theory and classical mechanics are revealed by minimax and dissipative Hamiltonian forms of the network dynamics.

The dynamics of a neural network with symmetric interactions provably converges to fixed points under very general assumptions[1, 2]. This mathematical result helped to establish the paradigm of neural computation with fixed point attractors[3]. But in reality, interactions between neurons in the brain are asymmetric. Furthermore, the dynamical behaviors seen in the brain are not confined to fixed point attractors, but also include oscillations and complex nonperiodic behavior. These other types of dynamics can be realized by asymmetric networks, and may be useful for neural computation. For these reasons, it is important to understand the global behavior of asymmetric neural networks.

The interaction between an excitatory neuron and an inhibitory neuron is clearly asymmetric. Here we consider a class of networks that incorporates this fundamental asymmetry of the brain's microcircuitry. Networks of this class have distinct populations of excitatory and inhibitory neurons, with antisymmetric interactions between populations and symmetric interactions within each population. Such networks display a rich repertoire of dynamical behaviors including fixed points, limit cycles[4, 5], and traveling waves[6].

After defining the class of excitatory-inhibitory networks, we introduce a Lyapunov function that establishes sufficient conditions for the global asymptotic stability of fixed points. The generality of these conditions contrasts with the restricted nature of previous convergence results, which applied only to linear networks[5], or to nonlinear networks with infinitely fast inhibition[7]. The use of the Lyapunov function is illustrated with a competitive or winner-take-all network, which consists of an excitatory population of neurons with recurrent inhibition from a single neuron[8]. For this network, the sufficient conditions for global stability of fixed points also happen to be necessary conditions. In other words, we have proved global stability over the largest possible parameter regime in which it holds, demonstrating the power of the Lyapunov function. There exists another parameter regime in which numerical simulations display limit cycle oscillations[7]. Similar convergence proofs for other excitatory-inhibitory networks may be obtained by tedious but straightforward calculations. All the necessary tools are given in the first half of the paper.

But the rest of the paper explains what makes the Lyapunov function especially interesting, beyond the convergence results it yields: its role in a conceptual framework that relates excitatory-inhibitory networks to optimization theory and classical mechanics. The connection between neural networks and optimization[3] was established by proofs that symmetric networks could find minima of objective functions[1, 2]. Later it was discovered that excitatory-inhibitory networks could perform the minimax computation of finding saddle points[9, 10, 11], though no general proof of this was given at the time.
Our Lyapunov function finally supplies such a proof, and one of its components is the objective function of the network's minimax computation.

Our Lyapunov function can also be obtained by writing the dynamics of excitatory-inhibitory networks in Hamiltonian form, with extra velocity-dependent terms. If these extra terms are dissipative, then the energy of the system is nonincreasing, and is a Lyapunov function. If the extra terms are not purely dissipative, limit cycles are possible. Previous Hamiltonian formalisms for neural networks made the more restrictive assumption of purely antisymmetric interactions, and did not include the effect of dissipation[12].

This paper establishes sufficient conditions for global asymptotic stability of fixed points. The problem of finding sufficient conditions for oscillatory and chaotic behavior remains open. The perspectives of minimax and Hamiltonian dynamics may help in this task.

1 EXCITATORY-INHIBITORY NETWORKS

The dynamics of an excitatory-inhibitory network is defined by

$$\tau_x \dot{x} + x = f(u + Ax - By), \qquad (1)$$
$$\tau_y \dot{y} + y = g(v + B^\top x - Cy). \qquad (2)$$

The state variables are contained in two vectors $x \in \mathbb{R}^m$ and $y \in \mathbb{R}^n$, which represent the activities of the excitatory and inhibitory neurons, respectively. The symbol $f$ is used in both scalar and vector contexts. The scalar function $f: \mathbb{R} \to \mathbb{R}$ is monotonic nondecreasing. The vector function $f: \mathbb{R}^m \to \mathbb{R}^m$ is defined by applying the scalar function $f$ to each component of a vector argument, i.e., $f(x) = (f(x_1), \ldots, f(x_m))$. The symbol $g$ is used similarly.

The symmetry of interaction within each population is imposed by the constraints $A = A^\top$ and $C = C^\top$. The antisymmetry of interaction between populations is manifest in the occurrence of $-B$ and $B^\top$ in the equations. The terms "excitatory" and "inhibitory" are appropriate with the additional constraint that the entries of the matrices $A$, $B$, and $C$ are nonnegative. Though this assumption makes sense in a neurobiological context, the mathematics does not depend on it. The constant vectors $u$ and $v$ represent tonic input from external sources, or alternatively bias intrinsic to the neurons.

The time constants $\tau_x$ and $\tau_y$ set the speed of excitatory and inhibitory synapses, respectively. In the limit of infinitely fast inhibition, $\tau_y = 0$, the convergence theorems for symmetric networks are applicable[1, 2], though some effort is required in applying them to the case $C \neq 0$. If the dynamics converges for $\tau_y = 0$, then there exists some neighborhood of $\tau_y = 0$ in which it still converges[7]. Our Lyapunov function goes further, as it is valid for more general $\tau_y$.

The potential for oscillatory behavior in excitatory-inhibitory networks like (1) has long been known[4, 7]. The origin of oscillations can be understood from a simple two-neuron model. Suppose that neuron 1 excites neuron 2, and receives inhibition back from neuron 2. Then the effect is that neuron 1 suppresses its own activity with an effective delay that depends on the time constant of inhibition. If this delay is long enough, oscillations result. However, these oscillations will die down to a fixed point, as the inhibition tends to dampen activity in the circuit. Only if neuron 1 also excites itself can the oscillations become sustained. Therefore, whether oscillations are damped or sustained depends on the choice of parameters; the simulation sketch below illustrates both regimes.
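To make the two-neuron picture concrete, here is a minimal simulation sketch (not code from the paper): forward-Euler integration of a scalar instance of (1)-(2), with $f = g = \tanh$ chosen as an example of a monotonic nondecreasing gain, and all parameter values illustrative assumptions. With slow inhibition ($\tau_y > \tau_x$) the activity oscillates in both cases, but the oscillations decay without self-excitation ($a = 0$) and are sustained with it ($a = 2$).

```python
# A minimal sketch (not code from the paper): forward-Euler integration of a
# two-neuron instance of eqs. (1)-(2), with f = g = tanh as an example of a
# monotonic nondecreasing gain. All parameter values here are illustrative.
import numpy as np

def simulate(a, b=1.5, c=0.0, u=0.0, v=0.0,
             tau_x=1.0, tau_y=2.0, dt=0.01, T=200.0):
    """Integrate tau_x dx/dt + x = f(u + a x - b y),
                 tau_y dy/dt + y = g(v + b x - c y)."""
    n_steps = int(T / dt)
    x, y = 0.1, 0.0                      # small perturbation from the origin
    xs = np.empty(n_steps)
    for i in range(n_steps):
        dx = (-x + np.tanh(u + a * x - b * y)) / tau_x
        dy = (-y + np.tanh(v + b * x - c * y)) / tau_y
        x, y = x + dt * dx, y + dt * dy
        xs[i] = x
    return xs

# Slow inhibition (tau_y > tau_x) yields oscillations in both cases; they are
# damped without self-excitation (a = 0) and sustained with it (a = 2).
for a in (0.0, 2.0):
    xs = simulate(a)
    early = np.ptp(xs[: len(xs) // 4])   # peak-to-peak amplitude, early window
    late = np.ptp(xs[-len(xs) // 4 :])   # peak-to-peak amplitude, late window
    verdict = "sustained" if late > 0.5 * early else "damped"
    print(f"a = {a}: early amplitude {early:.3f}, late {late:.3f} -> {verdict}")
```

Comparing early and late peak-to-peak amplitude is a crude but serviceable diagnostic for distinguishing the damped regime from the limit-cycle regime.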
In this paper we establish sufficient conditions for the global stability of fixed points in (1). The violation of these sufficient conditions indicates parameter regimes in which there may be other types of asymptotic behavior, such as limit cycles.
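As a closing illustration, the winner-take-all network from the introduction can be instantiated in the same way. The sketch below wires (1)-(2) as an excitatory population receiving recurrent inhibition from a single neuron; the weights, gains, and time constants are assumptions chosen for illustration, not the parameter regime treated by the convergence proof.

```python
# A minimal sketch (assumptions: forward Euler, tanh gains, illustrative
# weights) of the vector dynamics (1)-(2), wired as the winner-take-all
# architecture from the introduction: an excitatory population receiving
# recurrent inhibition from a single neuron. A structural illustration only.
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 1                       # 5 excitatory neurons, 1 inhibitory neuron
A = 0.2 * np.eye(m)               # weak self-excitation; A = A^T, entries >= 0
B = np.ones((m, n))               # every excitatory cell drives the inhibitor
C = np.zeros((n, n))              # no inhibitory self-interaction; C = C^T
u = rng.uniform(0.0, 1.0, m)      # tonic inputs: the competing drives
v = np.zeros(n)
tau_x, tau_y = 1.0, 0.5           # fast inhibition, the regime favoring convergence

x, y = np.zeros(m), np.zeros(n)
dt, T = 0.01, 100.0
for _ in range(int(T / dt)):
    dx = (-x + np.tanh(u + A @ x - B @ y)) / tau_x
    dy = (-y + np.tanh(v + B.T @ x - C @ y)) / tau_y
    x, y = x + dt * dx, y + dt * dy

# With fast inhibition the state should settle to a fixed point in which
# the neurons with the largest tonic inputs remain the most active.
print("inputs u :", np.round(u, 3))
print("final  x :", np.round(x, 3))
```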




Publication date: 1997